
    Solving monotone inclusions involving parallel sums of linearly composed maximally monotone operators

    The aim of this article is to present two different primal-dual methods for solving structured monotone inclusions involving parallel sums of compositions of maximally monotone operators with bounded linear operators. By employing elaborate splitting techniques, all of the operators occurring in the problem formulation are processed individually via forward or backward steps. The treatment of parallel sums of linearly composed maximally monotone operators is motivated by applications in imaging that involve first- and second-order total variation functionals, to which special attention is given.
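    For orientation, a prototypical inclusion of this kind (the notation below is a standard formulation, assumed here rather than quoted from the article) asks for a point $x$ with

        0 \in Ax + \sum_{i=1}^{m} L_i^{*}\bigl((B_i \,\square\, D_i)(L_i x)\bigr) + Cx,

    where $A$, $B_i$ and $D_i$ are maximally monotone, $C$ is a single-valued monotone operator that can be handled by forward steps, $L_i^{*}$ denotes the adjoint of the bounded linear operator $L_i$, and $B \,\square\, D = (B^{-1} + D^{-1})^{-1}$ is the parallel sum. First-order total variation arises, for instance, when $L_i$ is a discrete gradient operator.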

    Inertial Douglas-Rachford splitting for monotone inclusion problems

    We propose an inertial Douglas-Rachford splitting algorithm for finding the zeros of the sum of two maximally monotone operators in Hilbert spaces and investigate its convergence properties. To this end we first formulate an inertial version of the Krasnosel'skiĭ-Mann algorithm for approximating the fixed points of a nonexpansive operator, for which we also provide an exhaustive convergence analysis. By using a product space approach, we apply these results to monotone inclusion problems involving linearly composed and parallel-sum type operators, obtaining in this way iterative schemes in which each of the maximally monotone mappings is accessed separately via its resolvent. We also consider the special instance of solving a primal-dual pair of nonsmooth convex optimization problems and illustrate the theoretical results with numerical experiments in clustering and location theory.
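    A minimal sketch of the inertial Krasnosel'skiĭ-Mann iteration described above, in Python. The operator, the constant inertial parameter alpha, and the relaxation parameter lam are illustrative placeholders; the paper works with variable parameter sequences under suitable summability conditions.

        import numpy as np

        def inertial_km(T, x0, alpha=0.2, lam=0.5, n_iter=500, tol=1e-10):
            """Inertial Krasnosel'skii-Mann iteration for a nonexpansive T (sketch)."""
            x_prev = np.asarray(x0, dtype=float)
            x = x_prev.copy()
            for _ in range(n_iter):
                y = x + alpha * (x - x_prev)       # inertial extrapolation step
                x_next = y + lam * (T(y) - y)      # relaxed application of T
                if np.linalg.norm(x_next - x) < tol:
                    return x_next
                x_prev, x = x, x_next
            return x

        # Example: projection onto the unit ball is nonexpansive, and its fixed
        # points are exactly the points of the ball.
        T = lambda z: z / max(1.0, np.linalg.norm(z))
        x_star = inertial_km(T, [3.0, -4.0])       # converges to a boundary point

    Setting alpha to 0 recovers the classical Krasnosel'skiĭ-Mann scheme; the inertial term reuses the previous iterate to extrapolate before applying the operator.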

    A variable smoothing algorithm for solving convex optimization problems

    In this article we propose a method for solving unconstrained optimization problems with convex and Lipschitz continuous objective functions. By making use of the Moreau envelopes of the functions occurring in the objective, we smooth the latter into a convex and differentiable function with Lipschitz continuous gradient, using both variable and constant smoothing parameters. The resulting problem is solved via an accelerated first-order method, which allows us to approximately recover the optimal solutions of the initial optimization problem with a provable rate of convergence.
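    For reference, the Moreau envelope underlying this smoothing (a standard definition, assumed here rather than quoted from the article) of a proper, convex, lower semicontinuous function $f$ with parameter $\mu > 0$ is

        {}^{\mu}f(x) = \min_{y}\Bigl\{ f(y) + \frac{1}{2\mu}\lVert x - y\rVert^{2} \Bigr\},

    which is convex and differentiable with gradient

        \nabla\,{}^{\mu}f(x) = \frac{1}{\mu}\bigl(x - \operatorname{prox}_{\mu f}(x)\bigr),

    Lipschitz continuous with constant $1/\mu$. A variable smoothing scheme decreases $\mu$ along the iterations, trading approximation accuracy against the growing Lipschitz constant of the gradient.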